
    SSA of biomedical signals: A linear invariant systems approach

    Singular spectrum analysis (SSA) is considered from a linear invariant systems perspective. In this terminology, the extracted components are considered as outputs of a linear invariant system corresponding to finite impulse response (FIR) filters. The number of filters is determined by the embedding dimension. We propose to explicitly define the frequency response of each filter responsible for the selection of informative components. We also introduce a subspace distance measure for clustering subspace models. We illustrate the methodology by analyzing electroencephalograms (EEG). Funding: FCT PhD scholarships SFRH/BD/28404/2006 and SFRH/BD/48775/2008.
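
    As a rough illustration of the filtering view described above — each SSA component as the output of an FIR eigenfilter whose taps are an eigenvector of the lag-covariance matrix — the following Python sketch computes the eigenfilters of a toy two-tone signal and the explicit frequency response of the dominant one. The signal, embedding dimension and frequencies are invented for the example; this is not the paper's EEG pipeline.

```python
import numpy as np
from scipy.signal import freqz

rng = np.random.default_rng(0)
n, M = 1000, 30                       # samples, embedding dimension
t = np.arange(n)
# Toy two-tone signal standing in for a single EEG channel.
x = np.sin(0.2 * t) + 0.5 * np.sin(0.8 * t) + 0.3 * rng.standard_normal(n)

# Trajectory matrix of delayed coordinates (one row per window).
X = np.lib.stride_tricks.sliding_window_view(x, M)

# Eigenvectors of the lag-covariance matrix are the taps of M FIR
# eigenfilters; each extracted component is the signal filtered by one.
C = X.T @ X / X.shape[0]
eigvals, eigvecs = np.linalg.eigh(C)
eigvecs = eigvecs[:, ::-1]            # strongest component first

# Explicit frequency response of the dominant eigenfilter.
w, h = freqz(eigvecs[:, 0], worN=512)
peak = w[np.argmax(np.abs(h))]
```

    The magnitude response of the dominant eigenfilter peaks near the strongest tone (0.2 rad/sample here), which is the sense in which selecting informative components amounts to selecting filters by their frequency response.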

    Identifying evoked potential response patterns using independent component analysis and unsupervised learning

    Independent component analysis (ICA) is a pre-processing step widely used in brain studies. One of the most common problems in artifact elimination and brain-activity studies is the ordering and identification of the independent components (ICs). In this work, a novel procedure is proposed which combines ICA decomposition at the trial level with an unsupervised learning algorithm (K-means) at the participant level in order to enhance the related signal patterns that might represent interesting brain waves. The feasibility of this methodology is evaluated with EEG data acquired while participants performed the Halstead Category Test. The analysis shows that it is possible to find the feedback-related negativity (FRN) potential at the single-trial level and relate its characteristics to the performance of the participants based on their knowledge of the abstract principle underlying the task.
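
    The trial-level-ICA-plus-K-means idea can be sketched as follows. All details here are assumptions for illustration — the number of trials and channels, the synthetic sources, and the choice of excess kurtosis as the per-component feature are invented; the paper does not necessarily use this feature.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_trials, n_channels, n_samples = 20, 8, 200
t = np.linspace(0, 8 * np.pi, n_samples)

features = []
for _ in range(n_trials):
    # Two latent sources per trial: an oscillation and Gaussian noise.
    s = np.stack([np.sin(t), rng.standard_normal(n_samples)])
    A = rng.standard_normal((n_channels, 2))
    X = (A @ s).T + 0.01 * rng.standard_normal((n_samples, n_channels))

    # ICA decomposition at trial level.
    S = FastICA(n_components=2, random_state=0).fit_transform(X)

    # One simple per-component feature: excess kurtosis
    # (strongly sub-Gaussian for the oscillation, near zero for noise).
    features.extend(kurtosis(S, axis=0))

# Unsupervised step: cluster components pooled across trials so that
# recurring signal patterns group together, independent of IC ordering.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    np.asarray(features).reshape(-1, 1))
```

    Clustering over all trials sidesteps ICA's arbitrary component ordering: the oscillatory components land in one cluster regardless of the position FastICA assigned them within each trial.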

    Linear Invariant Systems Theory for Signal Enhancement

    This paper discusses a linear time-invariant (LTI) systems approach to signal enhancement via projective subspace techniques. It provides closed-form expressions for the frequency response of data-adaptive finite impulse response eigenfilters, obtained by applying LTI system properties to the filters resulting from the singular value decomposition. An illustrative example using speech enhancement is also presented.

    dAMUSE: a new tool for denoising and blind source separation

    In this work a generalized version of AMUSE, called dAMUSE, is proposed. The main modification consists of embedding the observed mixed signals in a high-dimensional feature space of delayed coordinates. With the embedded signals a matrix pencil is formed and its generalized eigendecomposition is computed, as in the AMUSE algorithm. We show that in this case the uncorrelated output signals are filtered versions of the unknown source signals. Furthermore, denoising the data can be achieved conveniently in parallel with the signal separation. Numerical simulations using artificially mixed signals are presented to show the performance of the method. Results of a heart rate variability (HRV) study are also discussed, showing that the output signals are related to LF (low-frequency) and HF (high-frequency) fluctuations. Finally, an application to separating artifacts from 2D NOESY NMR spectra and denoising the reconstructed artifact-free spectra is presented.
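
    A minimal sketch of the pencil construction described above, assuming NumPy/SciPy; the sources, mixing matrix, number of delays and lag are all made up for the example, and the paper's additional denoising step (discarding small eigenvalues) is omitted:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(2)
n = 5000
t = np.arange(n)
# Two artificial sources with different temporal structure, mixed linearly.
s = np.stack([np.sin(0.05 * t), np.sign(np.sin(0.113 * t))])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
x = A @ s + 0.05 * rng.standard_normal((2, n))

# dAMUSE's key modification: embed the mixtures in delayed coordinates.
delays = 3
X = np.vstack([np.roll(x, d, axis=1) for d in range(delays)])[:, delays:]

# Matrix pencil from the zero-lag and (symmetrized) lag-1 covariances.
X0, X1 = X[:, :-1], X[:, 1:]
C0 = X0 @ X0.T / X0.shape[1]
C1 = X0 @ X1.T / X0.shape[1]
C1 = 0.5 * (C1 + C1.T)

# Generalized eigendecomposition of the pencil (C1, C0), as in AMUSE;
# the columns of W unmix the embedded data into decorrelated outputs.
eigvals, W = eigh(C1, C0)
y = W.T @ X
```

    The outputs are mutually uncorrelated by construction (eigh normalizes the eigenvectors so that W.T C0 W = I), and each is a filtered version of a source because it is a linear combination of delayed mixtures.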

    Quasi-stationary distributions for the Domany-Kinzel stochastic cellular automaton

    We construct the quasi-stationary (QS) probability distribution for the Domany-Kinzel stochastic cellular automaton (DKCA), a discrete-time Markov process with an absorbing state. QS distributions are derived at both the one- and two-site levels. We characterize the distributions by their mean and various moment ratios, and analyze the lifetime of the QS state and the relaxation time to attain this state. Of particular interest are the scaling properties of the QS state along the critical line separating the active and absorbing phases. These exhibit a high degree of similarity to the contact process and the Malthus-Verhulst process (the closest continuous-time analogs of the DKCA), which extends to the scaling form of the QS distribution. Comment: 15 pages, 9 figures, submitted to PR
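
    The QS construction itself is generic: condition the process on survival, i.e. take the leading left eigenvector of the transition matrix restricted to the transient states. The sketch below does this for a tiny birth-death chain with an absorbing state (not the DKCA — the chain and its rates are invented to keep the example small):

```python
import numpy as np

# Transition matrix of a birth-death chain on {0, 1, 2, 3}; state 0 absorbs.
P = np.array([
    [1.0, 0.0, 0.0, 0.0],
    [0.4, 0.2, 0.4, 0.0],
    [0.0, 0.4, 0.2, 0.4],
    [0.0, 0.0, 0.5, 0.5],
])
Q = P[1:, 1:]                  # restriction to the transient states

# Iterate p -> p Q and renormalize by the survival probability;
# the limit is the quasi-stationary distribution.
p = np.full(3, 1.0 / 3.0)
for _ in range(2000):
    p = p @ Q
    p /= p.sum()

# At the fixed point, p Q = lam * p; lam is the per-step survival
# probability, so the lifetime of the QS state is 1 / (1 - lam).
lam = (p @ Q).sum()
lifetime = 1.0 / (1.0 - lam)
```

    The lifetime and relaxation-time quantities studied in the abstract are exactly this leading eigenvalue and the gap to the next one, evaluated for the DKCA's much larger transition matrix.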

    Denoising using local projective subspace methods

    In this paper we present denoising algorithms for enhancing noisy signals based on local ICA (LICA), delayed AMUSE (dAMUSE) and kernel PCA (KPCA). The LICA algorithm relies on applying ICA locally to clusters of signals embedded in a high-dimensional feature space of delayed coordinates. The components resembling the signals can be detected by various criteria, such as estimators of kurtosis or the variance of autocorrelations, depending on the statistical nature of the signal. The proposed algorithm can be applied favorably to the problem of denoising multi-dimensional data. Another projective subspace denoising method using delayed coordinates has been proposed recently with the algorithm dAMUSE. It combines the solution of blind source separation problems with denoising efforts in an elegant way and proves to be very efficient and fast. Finally, KPCA represents a non-linear projective subspace method that is also well suited for denoising. Besides illustrative applications to toy examples and images, we provide an application of all the algorithms considered to the analysis of protein NMR spectra.
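
    The common core shared by these methods — embed the signal in delayed coordinates, project onto a low-dimensional signal subspace, and reconstruct — can be sketched with plain SVD truncation (the LICA/dAMUSE/KPCA specifics are omitted; signal, noise level, window length and rank are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(4)
n, M, rank = 1000, 20, 2
t = np.arange(n)
clean = np.sin(0.15 * t)
noisy = clean + 0.5 * rng.standard_normal(n)

# Delay-embed and project onto the leading singular subspace:
# a sinusoid occupies a rank-2 subspace, noise spreads over all M.
X = np.lib.stride_tricks.sliding_window_view(noisy, M)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Xd = U[:, :rank] * s[:rank] @ Vt[:rank]

# Diagonal (Hankel) averaging maps the rank-reduced trajectory
# matrix back to a one-dimensional signal.
den = np.zeros(n)
cnt = np.zeros(n)
for i in range(X.shape[0]):
    den[i:i + M] += Xd[i]
    cnt[i:i + M] += 1
den /= cnt

mse_noisy = np.mean((noisy - clean) ** 2)
mse_den = np.mean((den - clean) ** 2)
```

    Keeping 2 of 20 components discards roughly 90% of the noise power while retaining the signal, which is the basic trade-off all three algorithms in the abstract refine (locally, via a matrix pencil, or in a kernel feature space).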

    ERP correlates of error processing during performance on the Halstead Category Test

    The Halstead Category Test (HCT) is a neuropsychological test that measures a person's ability to formulate and apply abstract principles. Performance must be adjusted based on feedback after each trial, and errors are common until the underlying rules are discovered. Event-related potential (ERP) studies associated with the HCT are lacking. This paper demonstrates the use of a methodology inspired by singular spectrum analysis (SSA), applied to EEG signals, to remove high-amplitude ocular and movement artifacts during performance on the test. This filtering technique introduces no phase or latency distortions, with minimal loss of relevant EEG information. Importantly, the test was applied in its original clinical format, without introducing adaptations for ERP recordings. After signal treatment, the feedback-related negativity (FRN) wave, which is related to error processing, was identified. This component peaked around 250 ms after feedback, in fronto-central electrodes. As expected, errors elicited more negative amplitudes than correct responses. Results are discussed in terms of the increased clinical potential that coupling ERP information with behavioral performance data can bring to the specificity of the HCT in diagnosing different types of impairment in frontal brain function.
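
    The FRN measurement described above — average feedback-locked epochs per condition, subtract, and locate the negative peak — can be sketched on synthetic data. Everything here is hypothetical (sampling rate, trial counts, amplitudes, and the Gaussian FRN template are invented), shown only to make the error-minus-correct logic concrete:

```python
import numpy as np

rng = np.random.default_rng(5)
fs = 250                              # Hz; epoch covers 0-600 ms post-feedback
t = np.arange(0, 0.6, 1 / fs)
# Hypothetical FRN template: negative deflection peaking near 250 ms.
frn = -4e-6 * np.exp(-((t - 0.25) ** 2) / (2 * 0.03 ** 2))

def epochs(n, amp):
    """Simulate n feedback-locked single trials with a scaled FRN."""
    return amp * frn + 2e-6 * rng.standard_normal((n, t.size))

erp_err = epochs(80, 1.0).mean(axis=0)    # error-feedback trials
erp_cor = epochs(80, 0.3).mean(axis=0)    # correct-feedback trials

# Error trials are more negative; the difference wave isolates the FRN.
diff = erp_err - erp_cor
peak_ms = 1000 * t[np.argmin(diff)]
```

    Averaging 80 trials shrinks the single-trial noise by a factor of about 9, which is why the peak of the difference wave lands reliably near the template's 250 ms latency.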

    Evidence for a mixed mass composition at the `ankle' in the cosmic-ray spectrum

    We report a first measurement for ultra-high energy cosmic rays of the correlation between the depth of shower maximum and the signal in the water-Cherenkov stations of air showers registered simultaneously by the fluorescence and the surface detectors of the Pierre Auger Observatory. Such a correlation measurement is a unique feature of a hybrid air-shower observatory with sensitivity to both the electromagnetic and muonic components. It allows an accurate determination of the spread of primary masses in the cosmic-ray flux. Until now, constraints on the spread of primary masses have been dominated by systematic uncertainties. The present correlation measurement is not affected by systematics in the measurement of the depth of shower maximum or the signal in the water-Cherenkov stations. The analysis relies on general characteristics of air showers and is thus also robust with respect to uncertainties in hadronic event generators. The observed correlation in the energy range around the `ankle' at lg(E/eV) = 18.5-19.0 differs significantly from expectations for pure primary cosmic-ray compositions. A light composition made up of protons and helium only is equally inconsistent with observations. The data are explained well by a mixed composition including nuclei with mass A > 4. Scenarios such as the proton dip model, with almost pure compositions, are thus disfavoured as the sole explanation of the ultra-high-energy cosmic-ray flux at Earth. Comment: Published version. Added journal reference and DOI. Added report number.

    Atmospheric effects on extensive air showers observed with the Surface Detector of the Pierre Auger Observatory

    Atmospheric parameters, such as pressure (P), temperature (T) and density, affect the development of extensive air showers initiated by energetic cosmic rays. We have studied the impact of atmospheric variations on extensive air showers by means of the surface detector of the Pierre Auger Observatory. The rate of events shows a ~10% seasonal modulation and a ~2% diurnal one. We find that the observed behaviour is explained by a model including the effects associated with the variations of pressure and density. The former affects the longitudinal development of air showers, while the latter influences the Molière radius and hence the lateral distribution of the shower particles. The model is validated with full simulations of extensive air showers using atmospheric profiles measured at the site of the Pierre Auger Observatory. Comment: 24 pages, 9 figures, accepted for publication in Astroparticle Physics
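
    A modulation model of this kind is typically a linear relation between the relative rate deviation and the weather deviations, fit by least squares. The sketch below uses synthetic data with invented coefficients and scales (they are not the Auger measurements), only to show the fitting step:

```python
import numpy as np

rng = np.random.default_rng(6)
# Synthetic daily deviations of ground pressure (hPa) and air density
# (kg/m^3) around their long-term means; scales are hypothetical.
n = 365
dP = rng.normal(0, 3, n)
drho = rng.normal(0, 0.02, n)

# Assumed "true" modulation coefficients for the sketch only.
aP, arho = -0.005, -3.0
rate = 1.0 + aP * dP + arho * drho + 0.003 * rng.standard_normal(n)

# Least-squares fit of the linear model dR/R = aP*dP + arho*drho.
Xd = np.column_stack([dP, drho])
coef, *_ = np.linalg.lstsq(Xd, rate - 1.0, rcond=None)
```

    With a year of daily points the two coefficients are recovered to high precision, which mirrors how the pressure and density effects in the abstract can be disentangled despite both varying seasonally and diurnally.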

    The exposure of the hybrid detector of the Pierre Auger Observatory

    The Pierre Auger Observatory is a detector for ultra-high energy cosmic rays. It consists of a surface array to measure secondary particles at ground level and a fluorescence detector to measure the development of air showers in the atmosphere above the array. The "hybrid" detection mode combines the information from the two subsystems. We describe the determination of the hybrid exposure for events observed by the fluorescence telescopes in coincidence with at least one water-Cherenkov detector of the surface array. A detailed knowledge of the time dependence of the detection operations is crucial for an accurate evaluation of the exposure. We discuss the relevance of monitoring data collected during operations, such as the status of the fluorescence detector, background light and atmospheric conditions, which are used in both simulation and reconstruction. Comment: Paper accepted by Astroparticle Physics